Neural Nets with Superlinear VC-Dimension

Author

  • Wolfgang Maass
Abstract

It has been known for quite a while that the Vapnik-Chervonenkis dimension (VC-dimension) of a feedforward neural net with linear threshold gates is at most O(w · log w), where w is the total number of weights in the neural net. We show in this paper that this bound is in fact asymptotically optimal. More precisely, we exhibit for any depth d ≥ 3 a large class of feedforward neural nets of depth d with w weights that have VC-dimension Ω(w · log w). This lower bound holds even if the inputs are restricted to Boolean values. The proof of this result relies on a new method that allows us to encode more "program-bits" in the weights of a neural net than previously thought possible.
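For context on what these bounds mean, the following Python sketch (an illustration assumed here, not part of the paper) verifies the elementary baseline fact: a single linear threshold gate on n inputs, which has n + 1 adjustable parameters (n weights plus a threshold), already shatters the n + 1 points {0, e_1, …, e_n}, so its VC-dimension is at least linear in its number of parameters. The paper's contribution is that depth ≥ 3 networks can do asymptotically better, reaching the superlinear Ω(w · log w).

```python
from itertools import product

def threshold_gate(w, theta, x):
    # Linear threshold gate: outputs 1 iff w·x >= theta.
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0

def shatters(points, n):
    # Check that every one of the 2^(n+1) labelings of the point set
    # {0, e_1, ..., e_n} is realized by some threshold gate.  For this
    # particular set the weights can be written down explicitly.
    for labels in product([0, 1], repeat=len(points)):
        # The origin is classified by the sign of -theta alone:
        theta = -0.5 if labels[0] == 1 else 0.5
        # Each unit vector e_i is classified by comparing w_i to theta:
        w = [theta + 1 if labels[i + 1] == 1 else theta - 1 for i in range(n)]
        if any(threshold_gate(w, theta, p) != lab
               for p, lab in zip(points, labels)):
            return False
    return True

n = 4
points = [tuple(0 for _ in range(n))] + \
         [tuple(1 if j == i else 0 for j in range(n)) for i in range(n)]
assert shatters(points, n)  # VC-dimension of one gate is at least n + 1
```

This only certifies a lower bound linear in the parameter count for a single gate; the superlinear constructions in the paper require encoding extra "program-bits" into the weights of a multi-layer net.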


Related Articles

Radial Basis Function Neural Networks Have Superlinear VC Dimension

We establish superlinear lower bounds on the Vapnik-Chervonenkis (VC) dimension of neural networks with one hidden layer and local receptive field neurons. As the main result we show that every reasonably sized standard network of radial basis function (RBF) neurons has VC dimension Ω(W log k), where W is the number of parameters and k the number of nodes. This significantly improves the previousl...


Product Unit Neural Networks with Constant Depth and Superlinear VC Dimension

It has remained an open question whether there exist product unit networks with constant depth that have superlinear VC dimension. In this paper we give an answer by constructing two-hidden-layer networks with this property. We further show that the pseudo dimension of a single product unit is linear. These results bear witness to the cooperative effects on the computational capabilities of prod...


Neural Networks with Local Receptive Fields and Superlinear VC Dimension

Local receptive field neurons comprise such well-known and widely used unit types as radial basis function (RBF) neurons and neurons with center-surround receptive field. We study the Vapnik-Chervonenkis (VC) dimension of feedforward neural networks with one hidden layer of these units. For several variants of local receptive field neurons, we show that the VC dimension of these networks is sup...



Neural Networks with Quadratic VC Dimension

This paper shows that neural networks which use continuous activation functions have VC dimension at least as large as the square of the number of weights w. This result settles a long-standing open question, namely whether the well-known O(w log w) bound, known for hard-threshold nets, also held for more general sigmoidal nets. Implications for the number of samples needed for valid generalizati...



Journal:
  • Electronic Colloquium on Computational Complexity (ECCC)

Volume 1, Issue

Pages -

Publication date: 1994